27 MAR 2023 by ideonexus

LLMs are Lossy Compression for the Entire WWW

To grasp the proposed relationship between compression and understanding, imagine that you have a text file containing a million examples of addition, subtraction, multiplication, and division. Although any compression algorithm could reduce the size of this file, the way to achieve the greatest compression ratio would probably be to derive the principles of arithmetic and then write the code for a calculator program. Using a calculator, you could perfectly reconstruct not just the million ex...
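The thought experiment can be made concrete with a toy sketch (the file format and operator set here are illustrative assumptions, not anything from the essay): a naive compressor must store every answer, but a compressor that has "derived the principles of arithmetic" can throw the answers away and recompute them on decompression, reconstructing the original file perfectly.

```python
# Toy version of the arithmetic-compression thought experiment.
# "Compression" keeps only the questions; "decompression" uses a
# calculator to recompute every answer, so reconstruction is lossless.

OPS = {
    '+': lambda a, b: a + b,
    '-': lambda a, b: a - b,
    '*': lambda a, b: a * b,
}

def compress(lines):
    # The answers are redundant once you know arithmetic,
    # so store only the question part of each line.
    return [line.split(' = ')[0] for line in lines]

def decompress(questions):
    # Recompute each answer instead of looking it up.
    out = []
    for q in questions:
        a, op, b = q.split()
        out.append(f"{q} = {OPS[op](int(a), int(b))}")
    return out

original = ["12 + 7 = 19", "40 - 13 = 27", "6 * 9 = 54"]
assert decompress(compress(original)) == original  # perfect reconstruction
```

Scaled to a million examples, the stored questions (or even just a program that enumerates them) are far smaller than the full file, which is the essay's point: the best compressor is the one that has internalized the underlying rules.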
Folksonomies: ai llm large language model